Lecture presented by Benoît Crabbé (LLF).
The Large Language Models introduced in recent years have proven extremely helpful in advancing the state of the art in many Natural Language Processing applications, notably due to their ability to compute numerical, high-dimensional representations of linguistic units such as words or sentences. Multilingual language models go one step further and add the ability to handle multiple languages, and sometimes even multiple scripts, with a single model. In this presentation, I will discuss multilingual language models at length, how they are typically trained and used, with a focus on measuring their multilingual abilities. The main question I will try to answer is: "what does it mean for a multilingual model X to cover language Y?"
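To make the idea of one model producing representations for several languages concrete, here is a minimal sketch (not from the lecture; the checkpoint name, example sentences, and mean-pooling strategy are illustrative assumptions) that uses the Hugging Face transformers library with a multilingual encoder such as XLM-RoBERTa to embed an English and a French sentence and compare the resulting vectors.

```python
# Illustrative sketch (not from the lecture): embedding sentences in two
# languages with one multilingual encoder. "xlm-roberta-base" is an assumed
# checkpoint; any multilingual model could be substituted.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

sentences = [
    "Language models compute representations of words.",             # English
    "Les modèles de langue calculent des représentations de mots.",  # French
]

# Tokenize both sentences with the same shared vocabulary.
batch = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # shape: (batch, seq_len, dim)

# Mean-pool over non-padding tokens to get one vector per sentence.
mask = batch["attention_mask"].unsqueeze(-1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# Cosine similarity between the English and French vectors: a crude proxy
# for how well the model aligns the two languages in the same space.
sim = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cross-lingual similarity: {sim.item():.3f}")
```

The cosine-similarity check is only one simple probe; measuring what it actually means for a model to "cover" a language is precisely the question the talk addresses.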
Keywords: giga-models, AI, computer science, JSALT, workshop
Information
- Gregor Dupuy
- 4 July 2023, 12:07
- Lecture
- English
- Master's level